# Japanese-English bilingual
## ELYZA Shortcut 1.0 Qwen 32B
elyza · Apache-2.0 · 172 downloads · 2 likes
ELYZA-Shortcut-1.0-Qwen-32B is a non-reasoning model built on Qwen2.5-32B-Instruct that skips explicit reasoning steps and generates final answers directly.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## ElvenMaid 12B V2
yamatazen · 50 downloads · 4 likes
ElvenMaid-12B-v2 is a 12B-parameter language model created by merging multiple pre-trained models with mergekit's TIES method. It uses the ChatML prompt format and supports interaction in English and Japanese.
Tags: Large Language Model, Transformers, Supports Multiple Languages
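The ChatML format used by models like the one above wraps each conversation turn in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of rendering a message list into such a prompt (this follows the general ChatML convention, not any model-specific template):

```python
def chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {"role", "content"} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        # Each turn: <|im_start|>role\ncontent<|im_end|>\n
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Leave an open assistant turn for the model to complete.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful bilingual assistant."},
    {"role": "user", "content": "こんにちは！"},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` from the Transformers library produces the model's own template automatically; the sketch above only illustrates the turn structure.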
## Llama 3 VNTL Yollisa 8B GGUF
mradermacher · 53 downloads · 1 like
An 8B-parameter model based on the Llama-3 architecture, specializing in translating and processing visual novels and Japanese otaku media content.
Tags: Large Language Model, Supports Multiple Languages
## Llama 3.1 Swallow 70B Instruct V0.3
tokyotech-llm · 1,659 downloads · 12 likes
Llama 3.1 Swallow is a series of large language models built on Meta Llama 3.1. It enhances Japanese capabilities through continual pre-training while retaining the base model's English capabilities.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## Llama 3.1 Swallow 8B Instruct V0.3
tokyotech-llm · 16.48k downloads · 20 likes
The 8B member of the Llama 3.1 Swallow series built on Meta Llama 3.1, with Japanese capabilities enhanced through continual pre-training while retaining English capabilities.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## Mistral Nemo Japanese Instruct 2408
cyberagent · Apache-2.0 · 1,898 downloads · 39 likes
A continually pre-trained Japanese model based on Mistral-Nemo-Instruct-2407, focused on Japanese text-generation tasks.
Tags: Large Language Model, Safetensors, Supports Multiple Languages
## Llama 3 Swallow 8B Instruct V0.1
tokyotech-llm · 13.88k downloads · 20 likes
A Japanese-optimized large language model built on Meta Llama 3, with Japanese capabilities enhanced through continual pre-training and instruction-following improved through supervised fine-tuning.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## Shisa V1 Llama3 8b
shisa-ai · 28 downloads · 6 likes
A Japanese-optimized large language model fine-tuned from Meta-Llama-3-8B-Instruct, performing strongly on multiple Japanese benchmarks.
Tags: Large Language Model, Transformers
## Fugaku LLM 13B
Fugaku-LLM · Other · 25 downloads · 123 likes
Fugaku-LLM is a Japanese domestically developed large language model, pre-trained from scratch on the Fugaku supercomputer. It offers high transparency and safety, with particularly strong performance in Japanese.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## Swallow MS 7b Instruct V0.1
tokyotech-llm · Apache-2.0 · 48 downloads · 14 likes
A Japanese-enhanced large language model continually pre-trained from Mistral-7B-v0.1.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## ELYZA Japanese Llama 2 13b Fast Instruct
elyza · 1,109 downloads · 23 likes
A Japanese-optimized model based on Llama 2, designed to improve the Japanese interaction experience.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## ELYZA Japanese Llama 2 13b Instruct
elyza · 1,022 downloads · 40 likes
ELYZA-japanese-Llama-2-13b is a Llama 2-based model with additional pre-training to enhance its Japanese capabilities.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## ELYZA Japanese Llama 2 7b Fast Instruct
elyza · 1,576 downloads · 75 likes
A Llama 2-based language model with Japanese capabilities extended through additional pre-training.
Tags: Large Language Model, Transformers, Supports Multiple Languages
## ELYZA Japanese Llama 2 7b Instruct
elyza · 5,917 downloads · 67 likes
A Llama 2-based language model with Japanese capabilities extended through additional pre-training.
Tags: Large Language Model, Transformers, Supports Multiple Languages